Self-control in Sparsely Coded Networks
Abstract
A complete self-control mechanism is proposed for the dynamics of neural networks through the introduction of a time-dependent threshold, determined as a function of both the noise and the pattern activity in the network. Especially for sparsely coded models, this mechanism is shown to considerably improve the storage capacity, the basins of attraction and the mutual information content of the network.

Sparsely coded models have attracted a lot of attention in the development of neural networks, both from the device-oriented and the biologically oriented point of view [1]–[2]. It is well known that they have a large storage capacity, which behaves as 1/(a ln a) for small a, where a is the pattern activity. However, it is clear that the basins of attraction should not become too small, because then sparse coding is, in fact, useless. In this context the necessity of an activity control system has been emphasized, which tries to keep the activity of the network during the retrieval process equal to that of the memorized patterns [3]–[5]. This has led to several proposals imposing external constraints on the dynamics (see the references in [2]). Clearly, the enforcement of such a constraint at every time step destroys part of the autonomous functioning of the network. An important question is then whether storage and retrieval with non-negligible basins of attraction can be improved, and even optimized, without imposing these external constraints, while keeping the simplicity of the network architecture. In this Letter we answer this question by proposing, as far as we are aware for the first time, a complete self-control mechanism in the dynamics of neural networks. This is done through the introduction of a time-dependent threshold in the transfer function. This threshold is chosen as a function of the noise in the system and of the pattern activity, and adapts itself in the course of the time evolution.
The difference with existing results in the literature [2] lies precisely in this adaptivity property. This immediately solves, e.g., the difficult problem of finding the rather narrow interval for an optimal threshold such that the basins of attraction of the memorized patterns do not shrink to zero. We have worked out the practical case of sparsely coded models. We find that the storage capacity, the basins of attraction as well as the mutual information content are improved. …
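As an illustration of the mechanism described above, the following minimal simulation sketch stores sparse binary {0,1} patterns with a covariance Hebbian rule and retrieves them with a threshold recomputed at every time step from the current network activity and the loading α = P/N. The specific functional form θ(t) = √(−2 ln a) · √(α q(t)/a), all parameter values, and the helper `retrieve` are assumptions made for the sake of the example, not the exact prescription of the Letter.

```python
import numpy as np

rng = np.random.default_rng(0)

# Network parameters (illustrative values only)
N = 1000        # number of binary {0,1} neurons
a = 0.05        # pattern activity (fraction of active neurons)
P = 20          # number of stored patterns; loading alpha = P/N

# Store P sparse random patterns with mean activity a
patterns = (rng.random((P, N)) < a).astype(float)

# Covariance (Hebbian) couplings for sparse patterns, no self-coupling
J = ((patterns - a).T @ (patterns - a)) / (a * (1 - a) * N)
np.fill_diagonal(J, 0.0)

def retrieve(state, steps=15):
    """Parallel dynamics with a self-adapting threshold.

    At each step the threshold is recomputed from the instantaneous
    network activity q(t) and the loading alpha = P/N; the exact
    functional form below is an assumption for illustration.
    """
    alpha = P / N
    for _ in range(steps):
        q = state.mean()                              # current activity
        theta = np.sqrt(-2.0 * np.log(a)) * np.sqrt(alpha * q / a)
        h = J @ state                                 # local fields
        state = (h > theta).astype(float)             # threshold dynamics
    return state

# Start from a noisy version of pattern 0 and let the network settle
noisy = patterns[0].copy()
flip = rng.random(N) < 0.02                           # flip 2% of the bits
noisy[flip] = 1.0 - noisy[flip]
final = retrieve(noisy)

# Fraction of the active sites of pattern 0 that are recovered
overlap = (final * patterns[0]).sum() / patterns[0].sum()
print(f"fraction of pattern-0 sites recovered: {overlap:.2f}")
```

Because the threshold tracks the activity instead of being fixed externally, the network keeps functioning autonomously: no constraint on q(t) is enforced from outside at any step.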
Related Articles
Retrieval dynamics of neural networks for sparsely coded sequential patterns
It is well known that a sparsely coded network in which the activity level is extremely low has intriguing equilibrium properties. In the present work, we study the dynamical properties of a neural network designed to store sparsely coded sequential patterns rather than static ones. Applying the theory of statistical neurodynamics, we derive the dynamical equations governing the retrieval proce...
Mutual Information of Three-State Low Activity Diluted Neural Networks with Self-Control
The influence of a macroscopic time-dependent threshold on the retrieval dynamics of attractor associative memory models with ternary neurons {−1, 0, +1} is examined. If the threshold is chosen appropriately as a function of the crosstalk noise and of the activity of the memorized patterns in the model, adapting itself in the course of the time evolution, it guarantees an autonomous functioning o...
Autonomous Perceptual Feature Extraction in a Topology-Constrained Architecture
In this paper, it is shown that the Feature-Extracting Bidirectional Associative Memory (FEBAM) can encompass competitive model features based on winner-take-all, k-winners-take-all and self-organizing feature map properties. The modified model achieves perceptual multidimensional feature extraction, cluster-based category formation through simultaneous creation of prototype/exemplar memories, a...
Properties of Associative Memory Neural Networks concerning Biological Information Encoding
Associative abilities of neural networks relevant to information coding in real neural systems are studied. The two models we adopt are the sparsely coded neural network and the oscillator neural network. We theoretically analyze these models with replica theory and the theory of statistical neurodynamics. These theories enable us to describe the states of the systems whic...
Iterative retrieval of sparsely coded associative memory patterns
We investigate the pattern completion performance of neural auto-associative memories composed of binary threshold neurons for sparsely coded binary memory patterns. Focussing on iterative retrieval, effective threshold control strategies are introduced. These are investigated by means of computer simulation experiments and analytical treatment. To evaluate the system's performance we consider th...
Publication year: 1998